User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements

Authors

  • Ginevra Castellano
  • Roberto Bresin
  • Antonio Camurri
  • Gualtiero Volpe
Abstract

In this paper we describe a system that allows users to express themselves through full-body movement and gesture and to control the generation of audio-visual feedback in real time. The system analyses the user's full-body movement and gesture in real time, extracts expressive motion features, and maps their values onto real-time control of acoustic parameters for rendering a music performance. At the same time, a visual feedback generated in real time is projected on a screen in front of the users, showing their coloured silhouette, depending on the emotion their movement communicates. Human movement analysis and visual feedback generation were done with the EyesWeb software platform, and the music performance rendering with pDM. Evaluation tests were carried out with human participants to assess the usability of the interface and the effectiveness of the design.
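The abstract outlines a pipeline in which expressive motion features extracted from full-body movement are mapped onto acoustic parameters of a music performance renderer. The sketch below illustrates one plausible shape of such a mapping in Python; the feature names (quantity of motion, contraction index), the parameter names, and the linear ranges are assumptions made for illustration, not the actual EyesWeb-to-pDM mapping used in the paper.

```python
# Illustrative sketch only: maps hypothetical expressive motion features onto
# performance parameters in the spirit of the pipeline described in the abstract.
# Feature names, parameter names, and ranges are assumptions, not the mapping
# actually implemented with EyesWeb and pDM.

from dataclasses import dataclass


@dataclass
class MotionFeatures:
    quantity_of_motion: float  # overall amount of detected movement, normalised to [0, 1]
    contraction_index: float   # how contracted/expanded the body posture is, in [0, 1]


@dataclass
class PerformanceParams:
    tempo_scale: float     # multiplier applied to the nominal score tempo
    sound_level_db: float  # offset in decibels relative to the nominal dynamics
    articulation: float    # 0 = very staccato, 1 = very legato


def lerp(lo: float, hi: float, t: float) -> float:
    """Linearly interpolate t in [0, 1] onto the range [lo, hi]."""
    return lo + (hi - lo) * max(0.0, min(1.0, t))


def map_features(f: MotionFeatures) -> PerformanceParams:
    # Energetic movement -> faster, louder; expanded posture -> more legato.
    return PerformanceParams(
        tempo_scale=lerp(0.8, 1.3, f.quantity_of_motion),
        sound_level_db=lerp(-6.0, 6.0, f.quantity_of_motion),
        articulation=lerp(0.2, 0.9, 1.0 - f.contraction_index),
    )


if __name__ == "__main__":
    # One frame of (hypothetical) feature values from the movement analysis.
    frame = MotionFeatures(quantity_of_motion=0.7, contraction_index=0.3)
    print(map_features(frame))
```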


Similar articles

Believable Visual Feedback in Motor Learning Using Occlusion-based Clipping in Video Mapping

Gait rehabilitation systems provide patients with guidance and feedback that assist them to better perform the rehabilitation tasks. Real-time feedback can guide users to correct their movements. Research has shown that the quality of feedback is crucial to enhance motor learning in physical rehabilitation. Common feedback systems based on virtual reality present interactive feedback in a monit...


How the system works

My visit to KTH aimed to develop a system in which audio-visual feedback is provided to users who are free to move in the space. The feedback depends on the expressivity of the body movements they perform. Specifically, users can render a music performance by controlling acoustic parameters in real time through their full-body movement and get a visual feedback on a screen in front of them with ...
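As a concrete illustration of the movement-analysis side of such a system, the sketch below computes a simple quantity-of-motion measure from two consecutive grayscale frames by frame differencing. This is only an assumed approximation of the kind of silhouette-based feature computed in EyesWeb, not the platform's own implementation; the threshold and normalisation are arbitrary choices.

```python
# Hedged sketch: a simple quantity-of-motion estimate from consecutive
# grayscale frames, roughly in the spirit of silhouette-based motion features.
# The threshold and normalisation are arbitrary assumptions, not the
# EyesWeb implementation used in the system described above.

import numpy as np


def quantity_of_motion(prev_frame: np.ndarray, curr_frame: np.ndarray,
                       threshold: int = 20) -> float:
    """Fraction of pixels whose intensity changed by more than `threshold`.

    Both frames are expected to be 2-D uint8 arrays of equal shape.
    Returns a value in [0, 1]: 0 = no movement, 1 = the whole frame changed.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving_pixels = np.count_nonzero(diff > threshold)
    return moving_pixels / diff.size


if __name__ == "__main__":
    # Two synthetic 120x160 frames: the second shifts a bright square slightly.
    prev = np.zeros((120, 160), dtype=np.uint8)
    curr = np.zeros((120, 160), dtype=np.uint8)
    prev[40:80, 60:100] = 255
    curr[42:82, 62:102] = 255
    print(f"quantity of motion: {quantity_of_motion(prev, curr):.3f}")
```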


Developing a dancer sonification system using the Immersive Interactive Sonification Platform (iISoP)

For decades, researchers have spurred research on sonification, the use of non-speech audio to convey information. As the level of automation increases in the workplace, monitoring the state and performance of complex systems can require more cognitive resources than are provided through visual information processing channels. In recent years, interactive sonification and gesture-based interaction h...


Analysis of the No Return Point Hypothesis: The Effect of Audio and Visual Stimuli in the Fast Movements Inhibition

Background. The No Return Point hypothesis is one of the research areas pursued within the motor program framework. This hypothesis emphasizes an inability to inhibit a movement once it has been started by the motor program. Several factors affect the mechanism of this inhibition. Objectives. In this study, we investigate the effects of audio and visual stimuli on blocking quick moves to ...


Hand-Controller for Combined Tactile Control and Motion Tracking

The Hand-Controller is a new interface designed to enable a performer to achieve detailed control of audio and visual parameters through a tangible interface combined with motion tracking of the hands to capture large-scale physical movement. Such movement empowers an expressive dynamic for both performer and audience. However, movement in free space is notoriously difficult for virtuosic perfor...



Journal title:

Volume   Issue

Pages  -

Publication year: 2007